Learning Bayesian Networks Under Sparsity Constraints: A Parameterized Complexity Analysis

Abstract

We study the problem of learning the structure of an optimal Bayesian network when additional constraints are posed on the network or on its moralized graph. More precisely, we consider the constraint that the network or its moralized graph is close, in terms of vertex or edge deletions, to a sparse graph class Π. For example, we show that an optimal network whose moralized graph has vertex deletion distance at most k from a graph with maximum degree 1 can be computed in polynomial time when k is constant. This extends previous work that gave an algorithm with such a running time for the vertex deletion distance to edgeless graphs. We then show that further extensions or improvements are presumably impossible. For example, learning optimal networks where the network or its moralized graph has maximum degree 2 or connected components of size at most c, c ≥ 3, is NP-hard. Finally, we show that learning an optimal network with at most k edges in its moralized graph presumably admits no f(k) · |I|^O(1)-time algorithm and that, in contrast, an optimal network with at most k arcs can be computed in 2^O(k) · |I|^O(1) time, where |I| is the total input size.
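The two graph notions the abstract relies on can be made concrete with a short sketch (our own illustration, not the paper's algorithm): building the moralized graph of a DAG, and a standard bounded-search-tree test for whether a graph has vertex deletion distance at most k to the edgeless graph, i.e., a vertex cover of size at most k, which is the special case covered by the previous work the abstract mentions.

```python
from itertools import combinations

def moralize(parents):
    """Moralized graph of a DAG given as {node: [parents]}: connect each
    node to its parents, 'marry' all co-parents, and drop edge directions."""
    edges = set()
    for child, ps in parents.items():
        for p in ps:
            edges.add(frozenset((p, child)))
        for a, b in combinations(ps, 2):
            edges.add(frozenset((a, b)))
    return edges

def deletion_distance_to_edgeless(edges, k):
    """Can deleting at most k vertices remove every edge (a vertex cover
    of size <= k)? Bounded search tree, O(2^k * |E|) branch nodes."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = tuple(next(iter(edges)))  # some edge must lose an endpoint
    for x in (u, v):
        remaining = {e for e in edges if x not in e}
        if deletion_distance_to_edgeless(remaining, k - 1):
            return True
    return False

# A v-structure A -> C <- B moralizes to the triangle A-B-C,
# whose deletion distance to the edgeless graph is 2, not 1.
tri = moralize({'C': ['A', 'B'], 'A': [], 'B': []})
```

The exponential dependence on k only (not on the graph size) is exactly the fixed-parameter tractability pattern the abstract's 2^O(k) · |I|^O(1) running time refers to.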


Similar Articles

Learning Bayesian Networks under Equivalence Constraints (Abstract)

Machine learning tasks typically assume that the examples of a given dataset are independent and identically distributed (i.i.d.). Yet, there are many domains and applications where this assumption does not strictly hold. Further, there may be additional information available that ties together the examples of a dataset, which we could exploit to learn more accurate models. For example, there a...


Learning Topic Models and Latent Bayesian Networks Under Expansion Constraints

Unsupervised estimation of latent variable models is a fundamental problem central to numerous applications of machine learning and statistics. This work presents a principled approach for estimating broad classes of such models, including probabilistic topic models and latent linear Bayesian networks, using only second-order observed moments. The sufficient conditions for iden...


Parameterized Complexity Results for Exact Bayesian Network Structure Learning

Bayesian network structure learning is the notoriously difficult problem of discovering a Bayesian network that optimally represents a given set of training data. In this paper we study the computational worst-case complexity of exact Bayesian network structure learning under graph theoretic restrictions on the (directed) super-structure. The super-structure is an undirected graph that contains...


Parameterized Complexity and Bayesian Models

Bayesian models are becoming more and more popular. Their use as an engineering tool (e.g., in decision support systems or as classifiers) has been widespread already since the 1990s, but in recent years Bayesian models have enjoyed an enormous popularity in cognitive science as well. In the latter domain, Bayesian techniques are being used to model mental processes that underlie cognitive behav...


The Parameterized Complexity of Approximate Inference in Bayesian Networks

Computing posterior and marginal probabilities constitutes the backbone of almost all inferences in Bayesian networks. These computations are known to be intractable in general, both to compute exactly and to approximate by sampling algorithms. While it is well known under what constraints exact computation can be rendered tractable (viz., bounding tree-width of the moralized network and boundi...



Journal

Journal title: Journal of Artificial Intelligence Research

Year: 2022

ISSN: 1076-9757, 1943-5037

DOI: https://doi.org/10.1613/jair.1.13138